Conjugate gradient acceleration techniques applied to electromagnetic scattering problems

Authors
Abstract


Similar Articles

An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. The parameters of the method are obtained by solving an optimization problem and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information at each iteration. The proposed method has global convergence und...
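As a rough illustration of the nonlinear conjugate gradient framework this abstract refers to, here is a minimal sketch using the classical Fletcher-Reeves parameter; the paper's own parameter, derived from a modified secant condition, and its line-search details are not reproduced. The exact line search below is an assumption valid only for quadratic objectives.

```python
import numpy as np

def nonlinear_cg(grad, x0, line_search, tol=1e-10, max_iter=100):
    """Nonlinear CG with the classical Fletcher-Reeves parameter
    (a stand-in for the paper's secant-condition-based parameter)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves parameter
        d = -g_new + beta * d
        g = g_new
    return x

# Quadratic test: f(x) = 0.5 x^T A x - b^T x, so grad f(x) = A x - b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b
# Exact line search, valid for quadratics (a Wolfe search in general).
exact_ls = lambda x, d: -(grad(x) @ d) / (d @ A @ d)
x_star = nonlinear_cg(grad, np.zeros(2), exact_ls)
```

With an exact line search on a quadratic, this recursion reduces to linear CG and terminates in at most n steps.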


Generalized Cross-Validation applied to Conjugate Gradient for discrete ill-posed problems

To apply the Generalized Cross-Validation (GCV) as a stopping rule for an iterative method, we must estimate the trace of the so-called influence matrix which appears in the denominator of the GCV function. In the case of conjugate gradient, unlike what happens with stationary iterative methods, the regularized solution has a nonlinear dependence on the noise which affects the data of the probl...
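The trace-estimation problem mentioned above is commonly attacked with matrix-free stochastic estimators. A minimal sketch of the Hutchinson estimator, which needs only matrix-vector products (the same primitive an iterative solver like CG already provides); the Rademacher probe vectors and sample count are illustrative choices, not the paper's method:

```python
import numpy as np

def hutchinson_trace(matvec, n, n_samples=200, seed=0):
    """Stochastic (Hutchinson) trace estimate: E[v^T B v] = trace(B)
    when v has independent Rademacher (+/-1) entries."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        total += v @ matvec(v)
    return total / n_samples

# For a diagonal B the estimator is exact for every probe vector,
# which makes the sketch easy to check: trace = 1 + 2 + 3 + 4 = 10.
B = np.diag([1.0, 2.0, 3.0, 4.0])
t_est = hutchinson_trace(lambda v: B @ v, 4)
```

For non-diagonal matrices the estimate carries sampling variance driven by the off-diagonal entries, which is exactly why estimating the GCV denominator for CG is delicate.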


The conjugate gradient algorithm applied to quaternion-valued matrices

The well-known conjugate gradient algorithm (cg-algorithm), introduced by Hestenes & Stiefel [1952] for real, symmetric, positive definite matrices, works as well for complex matrices and has the same typical convergence behavior. It will also work, not generally but in many cases, for Hermitian but not necessarily positive definite matrices. We shall show that the same behavior is s...
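A minimal sketch of the cg-algorithm run in complex arithmetic, using the Hermitian inner product u^H v, illustrating the "works as well for complex matrices" claim; the quaternion-valued case would need quaternion arithmetic on top of the same recursion and is not shown:

```python
import numpy as np

def cg(A, b, tol=1e-12, max_iter=None):
    """Hestenes-Stiefel CG; np.vdot conjugates its first argument, so all
    inner products below are the Hermitian form u^H v. For Hermitian
    positive definite A this converges as in the real SPD case."""
    n = len(b)
    max_iter = max_iter or 2 * n
    x = np.zeros(n, dtype=complex)
    r = b - A @ x
    p = r.copy()
    rs = np.vdot(r, r)                    # ||r||^2, real and nonnegative
    for _ in range(max_iter):
        if np.sqrt(rs.real) < tol:
            break
        Ap = A @ p
        alpha = rs / np.vdot(p, Ap)       # p^H A p is real > 0 for HPD A
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = np.vdot(r, r)
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Hermitian positive definite test matrix A = B^H B + I.
B = np.array([[1.0, 1.0j], [2.0, 1.0]])
A = B.conj().T @ B + np.eye(2)
b = np.array([1.0 + 0.0j, 1.0j])
x_sol = cg(A, b)
```

The only change from the real algorithm is the conjugated inner product; the recursion itself is untouched.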


Stochastic Proximal Gradient Descent with Acceleration Techniques

Proximal gradient descent (PGD) and stochastic proximal gradient descent (SPGD) are popular methods for solving regularized risk minimization problems in machine learning and statistics. In this paper, we propose and analyze an accelerated variant of these methods in the mini-batch setting. This method incorporates two acceleration techniques: one is Nesterov’s acceleration method, and the othe...
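For orientation, Nesterov's acceleration applied to proximal gradient descent can be sketched as the deterministic, full-batch FISTA iteration below for an l1-regularized least-squares problem; the paper's stochastic mini-batch variant and its second acceleration technique are not reproduced here.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, n_iter=500):
    """Nesterov-accelerated proximal gradient (full-batch FISTA) for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    step = 1.0 / np.linalg.norm(A.T @ A, 2)   # 1/L, L = Lipschitz constant
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ y - b)                 # gradient of the smooth part
        x_new = soft_threshold(y - step * g, step * lam)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

# Small least-squares problem whose unregularized solution is [1, -1];
# with a tiny lam the lasso solution stays close to it.
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = A @ np.array([1.0, -1.0])
x_hat = fista(A, b, lam=1e-3)
```

The momentum extrapolation through y is what lifts the O(1/k) rate of plain proximal gradient to O(1/k^2).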


Acceleration of conjugate gradient algorithms for unconstrained optimization

Conjugate gradient methods are important for large-scale unconstrained optimization. This paper proposes an acceleration of these methods using a modification of the steplength. The idea is to modify in a multiplicative manner the steplength α_k, computed by the Wolfe line search conditions, by means of a positive parameter η_k, in such a way as to improve the behavior of the classical conjugate gradien...
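One way to realize such a multiplicative steplength modification, as a hedged sketch only: the factor below, computed from one extra gradient evaluation at the trial point, is an assumption standing in for the paper's η_k, and a plain gradient direction stands in for the conjugate gradient direction.

```python
import numpy as np

def accelerated_descent(grad, x0, alpha=0.1, tol=1e-10, max_iter=1000):
    """Descent where the trial step alpha*d is rescaled by a multiplicative
    factor eta before the update x <- x + eta*alpha*d.

    eta is chosen from an extra gradient evaluation at the trial point;
    for quadratics this reproduces the exact line-search steplength.
    (Illustrative stand-in for the paper's Wolfe-based eta_k.)"""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                          # steepest-descent direction
        gz = grad(x + alpha * d)        # gradient at the trial point
        denom = (gz - g) @ d            # = alpha * d^T A d for quadratics
        eta = -(g @ d) / denom if denom > 0 else 1.0
        x = x + eta * alpha * d         # multiplicatively rescaled step
    return x

# Quadratic test problem: grad f(x) = A x - b, minimizer x = [1, 1].
A = np.diag([3.0, 1.0])
b = np.array([3.0, 1.0])
x_acc = accelerated_descent(lambda x: A @ x - b, np.zeros(2))
```

The rescaling costs one gradient evaluation per iteration but can turn a fixed trial steplength into a near-optimal one, which is the spirit of the acceleration described above.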



Journal

Journal title: Mathematical and Computer Modelling

Year: 1989

ISSN: 0895-7177

DOI: 10.1016/0895-7177(89)90435-4